Information geometry and entropy in a stochastic epidemic rate process
Author
Abstract
Epidemic models with inhomogeneous populations have been used to study major outbreaks and recently Britton and Lindenstrand [5] described the case when latency and infectivity have independent gamma distributions. They found that variability in these random variables had opposite effects on the epidemic growth rate. That rate increased with greater variability in latency but decreased with greater variability in infectivity. Here we extend their result by using the McKay bivariate gamma distribution for the joint distribution of latency and infectivity, recovering the above effects of variability but allowing possible correlation. We use methods of stochastic rate processes to obtain explicit solutions for the growth of the epidemic and the evolution of the inhomogeneity and information entropy. We obtain a closed analytic solution to the evolution of the distribution of the number of uninfected individuals as the epidemic proceeds, and a concomitant expression for the decay of entropy. The family of McKay bivariate gamma distributions has a tractable information geometry which provides a framework in which the evolution of distributions can be studied as the outbreak grows, with a natural distance structure for quantitative tracking of progress.
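For readers who want to experiment with the distribution discussed above, the following is a minimal numerical sketch, assuming the standard additive construction of the McKay bivariate gamma distribution: X ~ Gamma(p, rate c), Y = X + Z with an independent Z ~ Gamma(q, rate c), so that 0 < X < Y and, under this construction, corr(X, Y) = sqrt(p/(p+q)). The parameter values and the identification with latency and infectivity are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch, assuming the additive McKay construction:
#   X ~ Gamma(p, rate c),  Z ~ Gamma(q, rate c) independent,  Y = X + Z,
# so 0 < X < Y, Y ~ Gamma(p + q, rate c) and corr(X, Y) = sqrt(p / (p + q)).
# Parameter values below are illustrative, not taken from the paper.
import numpy as np

p, q, c = 2.0, 3.0, 1.5          # shape parameters p, q and common rate c
n = 200_000                      # Monte Carlo sample size

rng = np.random.default_rng(42)
x = rng.gamma(shape=p, scale=1.0 / c, size=n)   # first marginal: Gamma(p, c)
z = rng.gamma(shape=q, scale=1.0 / c, size=n)   # independent increment
y = x + z                                       # second marginal: Gamma(p + q, c)

print("sample corr(X, Y):", np.corrcoef(x, y)[0, 1])
print("theoretical corr :", np.sqrt(p / (p + q)))
print("E[Y] sample vs (p + q)/c:", y.mean(), (p + q) / c)
```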
Similar articles
Entropy for DTMC SIS Epidemic Model
In this paper, a history of mathematical models is given first. Next, some basic information about random variables, stochastic processes and Markov chains is introduced. Then the entropy of a discrete-time Markov process is defined. After that, the entropy for stochastic SIS models is computed, and it is proved that the epidemic will disappear after a long time.
Full text
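As an informal illustration of the extinction statement (a sketch only, not the construction used in the cited paper), one can propagate the distribution of the number of infectives in a small birth-death DTMC SIS chain with an absorbing disease-free state and watch the Shannon entropy of that distribution fall towards zero as probability mass accumulates on extinction. All transition rules and parameter values below are assumptions made for the sketch.

```python
# A minimal sketch, assuming a birth-death DTMC SIS chain on {0, 1, ..., N}
# with absorbing state 0: in one small time step dt, an infective count i
# increases w.p. beta*i*(N-i)/N*dt, decreases w.p. gamma*i*dt, else stays.
# All parameter values are illustrative (subcritical, so extinction is fast).
import numpy as np

N, beta, gamma, dt = 20, 1.0, 1.2, 0.05
P = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    up = beta * i * (N - i) / N * dt      # i -> i + 1: new infection
    down = gamma * i * dt                 # i -> i - 1: recovery
    if i < N:
        P[i, i + 1] += up
    P[i, max(i - 1, 0)] += down
    P[i, i] += 1.0 - up - down            # remaining probability: stay put

p = np.zeros(N + 1)
p[5] = 1.0                                # start with 5 infectives

def entropy(dist):
    nz = dist[dist > 1e-300]              # avoid log(0) on empty states
    return -np.sum(nz * np.log(nz))

block = np.linalg.matrix_power(P, 400)    # advance 400 steps (20 time units) at once
for k in range(7):
    print(f"t = {k * 400 * dt:6.1f}   P(extinct) = {p[0]:.3f}   entropy = {entropy(p):.3f}")
    p = p @ block
```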
Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain
In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...
Full text
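The hidden-Markov case requires the observation channel, but the flavour of the quantity can be seen in the simpler chain-versus-chain setting, where the relative entropy rate between two homogeneous chains P and Q on a common finite state space is h(P‖Q) = Σ_i π_i Σ_j P_ij log(P_ij / Q_ij), with π the stationary distribution of P. The sketch below computes this simplified quantity with illustrative matrices; it is not the cited paper's hidden-Markov construction.

```python
# A minimal sketch of the relative entropy rate between two homogeneous
# Markov chains P and Q on the same finite state space:
#   h(P || Q) = sum_i pi_i sum_j P_ij * log(P_ij / Q_ij),
# where pi is the stationary distribution of P.  This is the simpler
# chain-vs-chain case, not the hidden-Markov setting of the cited paper.
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
Q = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# stationary distribution of P: left eigenvector for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()

rate = np.sum(pi[:, None] * P * np.log(P / Q))
print("relative entropy rate h(P||Q) =", rate)
```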
Information Geometry of Non-Equilibrium Processes in a Bistable System with a Cubic Damping
A probabilistic description is essential for understanding the dynamics of stochastic systems far from equilibrium, given the uncertainty inherent in such systems. To compare different probability density functions (PDFs), it is extremely useful to quantify the difference among them by assigning an appropriate metric to probability, such that the distance increases with the difference betwe...
Full text
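As a generic illustration of assigning a numerical "difference" to a pair of PDFs (the cited paper develops its own information-geometric diagnostic; the snippet below is not it), one can evaluate the Kullback-Leibler divergence between two Gaussian densities on a grid and check it against the known closed form. Means and variances here are arbitrary illustrative values.

```python
# A minimal sketch: quantify the difference between two PDFs with the
# Kullback-Leibler divergence, evaluated on a grid and checked against the
# closed form for Gaussians.  Illustrative only; not the measure used in the
# cited paper.
import numpy as np

mu1, s1 = 0.0, 1.0
mu2, s2 = 1.0, 1.5

x = np.linspace(-12, 12, 20001)
p = np.exp(-0.5 * ((x - mu1) / s1) ** 2) / (s1 * np.sqrt(2 * np.pi))
q = np.exp(-0.5 * ((x - mu2) / s2) ** 2) / (s2 * np.sqrt(2 * np.pi))

kl_numeric = np.sum(p * np.log(p / q)) * (x[1] - x[0])   # grid approximation
kl_exact = np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5
print(kl_numeric, kl_exact)   # the two values should agree closely
```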
The Rate of Entropy for Gaussian Processes
In this paper, we show that in order to obtain the Tsallis entropy rate for stochastic processes, we can use the limit of conditional entropy, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon and Tsallis entropy rates for stationary Gaussian proc...
Full text
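For a single Gaussian the three entropies have simple closed forms, which the following sketch checks numerically. Note that the cited paper concerns entropy rates of stationary Gaussian processes; this is only a one-dimensional illustration with assumed parameter values.

```python
# A minimal sketch: Shannon, Renyi and Tsallis entropies of a single Gaussian
# N(0, sigma^2), comparing grid integration with the closed forms.  Parameter
# values (sigma, order q) are illustrative assumptions.
import numpy as np

sigma, q = 1.3, 2.0
x = np.linspace(-15 * sigma, 15 * sigma, 200001)
dx = x[1] - x[0]
p = np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

int_pq = np.sum(p ** q) * dx                       # integral of p^q

shannon_num = -np.sum(p * np.log(p)) * dx
renyi_num = np.log(int_pq) / (1 - q)
tsallis_num = (1 - int_pq) / (q - 1)

shannon_cf = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)
renyi_cf = 0.5 * np.log(2 * np.pi * sigma ** 2) + np.log(q) / (2 * (q - 1))
tsallis_cf = (1 - (2 * np.pi * sigma ** 2) ** ((1 - q) / 2) / np.sqrt(q)) / (q - 1)

print(shannon_num, shannon_cf)
print(renyi_num, renyi_cf)
print(tsallis_num, tsallis_cf)
```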
Information Geometry for Bivariate Distribution Control
The optimal control of stochastic processes through sensor estimation of probability density functions has a geometric setting via information theory and the information metric. Information theory identifies the exponential distribution as the maximum entropy distribution if only the mean is known, and the gamma distribution if the mean logarithm is also known. Previously, we used the surface re...
Full text
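One concrete ingredient of this information-geometric viewpoint is the Fisher information metric on the univariate gamma family. The sketch below, written for the shape-rate parameterisation as an assumption rather than as code from either paper, compares the standard closed-form metric with a Monte Carlo estimate obtained from the covariance of the score.

```python
# A minimal sketch: Fisher information metric on the gamma family with
# density f(x; a, b) = b^a x^(a-1) exp(-b x) / Gamma(a)  (shape a, rate b).
# Closed form:  g = [[psi_1(a), -1/b], [-1/b, a/b^2]],  psi_1 = trigamma.
# Checked against a Monte Carlo estimate of the score covariance; the
# parameterisation and values are illustrative assumptions.
import numpy as np
from scipy.special import digamma, polygamma

a, b, n = 2.5, 1.5, 400_000
rng = np.random.default_rng(1)
x = rng.gamma(shape=a, scale=1.0 / b, size=n)

# score components d/da log f and d/db log f
score_a = np.log(b) + np.log(x) - digamma(a)
score_b = a / b - x
score = np.vstack([score_a, score_b])

g_mc = score @ score.T / n                     # Monte Carlo Fisher matrix
g_cf = np.array([[polygamma(1, a), -1.0 / b],
                 [-1.0 / b, a / b ** 2]])

print("Monte Carlo:\n", g_mc)
print("closed form:\n", g_cf)
```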